Search results for "Rejection sampling"

Showing 10 of 14 documents

Group Metropolis Sampling

2017

Monte Carlo (MC) methods are widely used for Bayesian inference and optimization in statistics, signal processing and machine learning. Two well-known classes of MC methods are the Importance Sampling (IS) techniques and the Markov Chain Monte Carlo (MCMC) algorithms. In this work, we introduce the Group Importance Sampling (GIS) framework, where different sets of weighted samples are properly summarized with one summary particle and one summary weight. GIS facilitates the design of novel efficient MC techniques. For instance, we present the Group Metropolis Sampling (GMS) algorithm, which produces a Markov chain of sets of weighted samples. GMS in general outperforms other multiple-try schemes…
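A minimal sketch of the group-summary idea as I read the abstract: a set of weighted samples is compressed into one summary particle (resampled in proportion to the weights) and one summary weight (here, the mean weight). Function names and constants are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def summarize_group(samples, weights):
    """Compress weighted samples into (summary_particle, summary_weight)."""
    weights = np.asarray(weights, dtype=float)
    probs = weights / weights.sum()
    idx = rng.choice(len(samples), p=probs)   # resample one representative
    return samples[idx], weights.mean()       # summary particle and weight

# Toy usage: importance samples for a standard normal target,
# drawn from a wide N(0, 3^2) proposal (shared constants dropped).
target_logpdf = lambda x: -0.5 * x**2
proposal_logpdf = lambda x: -0.5 * (x / 3.0)**2 - np.log(3.0)
draws = rng.normal(0.0, 3.0, size=100)
w = np.exp(target_logpdf(draws) - proposal_logpdf(draws))
particle, weight = summarize_group(draws, w)
print(particle, weight)
```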

Keywords: Computer science; Monte Carlo method; Markov process; Slice sampling; Probability density function; Multiple-try Metropolis; Bayesian inference; Machine learning; Hybrid Monte Carlo; Markov chain; Rejection sampling; Sampling (statistics); Markov chain Monte Carlo; Metropolis–Hastings algorithm; Monte Carlo method in statistical physics; Monte Carlo integration; Artificial intelligence; Particle filter; Importance sampling; Monte Carlo molecular modeling

Recycling Gibbs sampling

2017

Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning and statistics. The key point for the successful application of the Gibbs sampler is the ability to draw samples efficiently from the full-conditional probability density functions. In the general case this is not possible, so auxiliary samples must be generated in order to speed up the convergence of the chain. However, this intermediate information is ultimately discarded. In this work, we show that these auxiliary samples can be recycled within the Gibbs estimators, improving their efficiency at no extra cost. Theoretical and exhaustive numerical co…
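A minimal sketch of the recycling idea, under my reading of the abstract: when a full conditional is sampled with a short internal Metropolis-Hastings chain, the intermediate (auxiliary) states are kept in the estimator instead of being thrown away. The target here is a toy bivariate Gaussian whose conditionals we pretend are intractable; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
rho = 0.8  # correlation of the toy bivariate Gaussian target

def cond_logpdf(x, other):
    """Log full-conditional: x | other ~ N(rho*other, 1 - rho^2)."""
    return -0.5 * (x - rho * other)**2 / (1.0 - rho**2)

def mh_within_gibbs(x_init, other, n_steps=5, step=1.0):
    """Run a short MH chain on one conditional; return ALL states."""
    states, x = [], x_init
    for _ in range(n_steps):
        prop = x + step * rng.normal()
        if np.log(rng.uniform()) < cond_logpdf(prop, other) - cond_logpdf(x, other):
            x = prop
        states.append(x)
    return states

x1, x2, recycled = 0.0, 0.0, []
for _ in range(2000):
    chain1 = mh_within_gibbs(x1, x2); x1 = chain1[-1]
    chain2 = mh_within_gibbs(x2, x1); x2 = chain2[-1]
    # Recycled estimator: average over every auxiliary state of each
    # inner chain, not just its final state.
    recycled.extend((s, x2) for s in chain1)
    recycled.extend((x1, s) for s in chain2)

print("E[x1] approx:", np.mean([p[0] for p in recycled]))  # should be near 0
```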

Keywords: Computer science; Monte Carlo method; Slice sampling; Markov process; Probability density function; Machine learning; Hybrid Monte Carlo; Rejection sampling; Estimator; Markov chain Monte Carlo; Artificial intelligence; Algorithm; Gibbs sampling
Published in: 2017 25th European Signal Processing Conference (EUSIPCO)

Parsimonious adaptive rejection sampling

2017

Monte Carlo (MC) methods have become very popular in signal processing during the past decades. Adaptive rejection sampling (ARS) algorithms are well-known MC techniques that efficiently draw independent samples from univariate target densities. The ARS schemes yield a sequence of proposal functions that converge toward the target, so that the probability of accepting a sample approaches one. However, sampling from the proposal pdf becomes more computationally demanding each time it is updated. We propose the Parsimonious Adaptive Rejection Sampling (PARS) method, where an efficient trade-off between acceptance rate and proposal complexity is obtained. Thus, the resulting algorithm is f…
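A minimal non-adaptive rejection sampler, to show the accept/reject mechanics that ARS and PARS build on: in ARS the envelope would be tightened after rejections, while PARS (per the abstract) limits how often the proposal is updated to keep it cheap. The target, envelope and constant M below are mine.

```python
import numpy as np

rng = np.random.default_rng(2)

def target_pdf(x):
    """Unnormalized target: a two-component Gaussian mixture."""
    return np.exp(-0.5 * (x - 2)**2) + 0.5 * np.exp(-0.5 * (x + 2)**2)

# Envelope: scaled wide Gaussian; M chosen (by inspection) so that
# M * envelope_pdf >= target_pdf everywhere. A tighter envelope raises
# the acceptance rate but, in ARS, makes the proposal costlier.
M, sigma = 12.0, 4.0
envelope_pdf = lambda x: np.exp(-0.5 * (x / sigma)**2) / (sigma * np.sqrt(2 * np.pi))

samples, tries = [], 0
while len(samples) < 5000:
    x = rng.normal(0.0, sigma)   # draw from the envelope
    tries += 1
    if rng.uniform() < target_pdf(x) / (M * envelope_pdf(x)):
        samples.append(x)        # accept with probability target / (M * envelope)

print("acceptance rate:", len(samples) / tries)
```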

Keywords: Signal processing; Sequence; Computer science; Monte Carlo method; Rejection sampling; Univariate; Sampling (statistics); Sample (statistics); Adaptive filter; Algorithm
Published in: Electronics Letters

Grapham: Graphical models with adaptive random walk Metropolis algorithms

2008

Recently developed adaptive Markov chain Monte Carlo (MCMC) methods have been applied successfully to many problems in Bayesian statistics. Grapham is a new open source implementation covering several such methods, with emphasis on graphical models for directed acyclic graphs. The implemented algorithms include the seminal Adaptive Metropolis algorithm adjusting the proposal covariance according to the history of the chain and a Metropolis algorithm adjusting the proposal scale based on the observed acceptance probability. Different variants of the algorithms allow one, for example, to use these two algorithms together, employ delayed rejection and adjust several parameters of the algorithm…
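A minimal sketch of one adaptation rule the abstract mentions: adjusting the random-walk proposal scale from the observed acceptance probability, here with a diminishing-step (Robbins-Monro style) rule targeting an acceptance rate of about 0.234. This is a generic illustration, not Grapham's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
logpi = lambda x: -0.5 * x**2          # toy target: standard normal

x, log_scale, target_acc = 0.0, 0.0, 0.234
for n in range(1, 10001):
    prop = x + np.exp(log_scale) * rng.normal()
    acc = np.exp(min(0.0, logpi(prop) - logpi(x)))  # MH acceptance prob.
    if rng.uniform() < acc:
        x = prop
    # Diminishing-step adaptation: raise the scale when acceptance runs
    # above the target rate, lower it otherwise.
    log_scale += (acc - target_acc) / n**0.6

print("adapted scale:", np.exp(log_scale))
```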

Keywords: Statistics and Probability; Markov chain; Adaptive algorithm; Rejection sampling; Markov chain Monte Carlo; Multiple-try Metropolis; Metropolis–Hastings algorithm; Graphical model; Algorithm; Gibbs sampling
Published in: Computational Statistics & Data Analysis

Exact simulation of first exit times for one-dimensional diffusion processes

2019

The simulation of exit times for diffusion processes is a challenging task, since it concerns many applications in different fields such as mathematical finance, neuroscience, reliability, and so on. The usual procedure is to use discretization schemes, which unfortunately introduce some error in the target distribution. Our aim is to present a new algorithm which simulates exactly the exit time for one-dimensional diffusions. This acceptance-rejection algorithm requires the exact simulation of the exit time of the Brownian motion on one side, and of the Brownian position at a given time, conditioned not to have exited before, on the other side. Crucial tools in this study …
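A structural sketch (not the paper's algorithm) of the acceptance-rejection pattern the abstract describes: propose a Brownian-motion exit time, then accept it with a Girsanov-type weight so that accepted times follow the diffusion's exit law. Both helpers below are toy stand-ins; in particular, the paper simulates the Brownian exit time exactly, whereas the stand-in here uses a fine Euler walk, and the acceptance weight is a dummy.

```python
import numpy as np

rng = np.random.default_rng(4)

def propose_bm_exit(level=1.0, dt=1e-3):
    """Toy stand-in: first exit time of BM from (-level, level),
    approximated by a fine Euler walk."""
    w, t = 0.0, 0.0
    while abs(w) < level:
        w += np.sqrt(dt) * rng.normal()
        t += dt
    return t

def girsanov_weight(t):
    """Toy stand-in for the Girsanov-based acceptance probability;
    must lie in [0, 1]."""
    return np.exp(-0.1 * t)

accepted = []
while len(accepted) < 200:
    tau = propose_bm_exit()
    if rng.uniform() < girsanov_weight(tau):   # accept/reject step
        accepted.append(tau)

print("mean accepted exit time:", np.mean(accepted))
```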

Keywords: Exit time; Brownian motion; Diffusion processes; Girsanov's transformation; Rejection sampling; Exact simulation; Randomized algorithm; Conditioned Brownian motion; Discretization; Mathematical finance; Numerical analysis; Probability
MSC 2010: primary 65C05; secondary 65N75, 60G40

Avoiding Boundary Effects in Wang-Landau Sampling

2003

A simple modification of the "Wang-Landau sampling" algorithm removes the systematic error that occurs at the boundary of the range of energy over which the random walk takes place in the original algorithm.
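A minimal Wang-Landau sketch on a toy model (N coins, "energy" = number of heads, true density of states = the binomial coefficients), showing the random walk in energy whose boundary behavior the paper corrects. The model is mine; only the WL mechanics (flat-histogram acceptance and the g-update) are standard.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(5)
N = 10
state = rng.integers(0, 2, N)          # coin configuration
E = state.sum()                         # energy = number of heads
log_g = np.zeros(N + 1)                 # running log density of states
hist = np.zeros(N + 1)
f = 1.0                                 # modification factor (ln scale)

while f > 1e-4:
    i = rng.integers(N)                 # propose: flip one coin
    E_new = E + (1 - 2 * state[i])
    # WL acceptance: favor energies with a small current g estimate.
    if np.log(rng.uniform()) < log_g[E] - log_g[E_new]:
        state[i] ^= 1
        E = E_new
    log_g[E] += f                       # update the level just visited
    hist[E] += 1
    if hist.min() > 0.8 * hist.mean():  # crude flatness check
        hist[:] = 0
        f /= 2.0

# Compare with the exact binomial density of states (up to a shift);
# the differences should be small.
log_g -= log_g[0]
exact = np.log([comb(N, k) for k in range(N + 1)])
print(np.round(log_g - exact, 2))
```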

Keywords: Heterogeneous random walk in one dimension; Statistical mechanics; Rejection sampling; Slice sampling; Sampling (statistics); Boundary (topology); Random walk; Range (statistics); Energy; Combinatorics; Applied mathematics

A new strategy for effective learning in population Monte Carlo sampling

2016

In this work, we focus on advancing the theory and practice of a class of Monte Carlo methods, population Monte Carlo (PMC) sampling, for dealing with inference problems with static parameters. We devise a new method for efficient adaptive learning from past samples and weights to construct improved proposal functions. It is based on assuming that, at each iteration, there is an intermediate target and that this target gradually gets closer to the true one. Computer simulations confirm the improvement of the proposed strategy over the traditional PMC method in a simple scenario.
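A minimal vanilla PMC sketch matching the abstract's setup: at each iteration, draw from a population of proposals, weight the draws against the target, and adapt the proposal locations by resampling. The paper's "intermediate target" strategy is not reproduced here; target and constants are mine.

```python
import numpy as np

rng = np.random.default_rng(6)
log_target = lambda x: -0.5 * (x - 3.0)**2   # toy target: N(3, 1)

M, sigma = 50, 1.0                            # population size, proposal std
mus = rng.normal(0.0, 5.0, M)                 # initial proposal locations
for it in range(20):
    xs = mus + sigma * rng.normal(size=M)     # one draw per proposal
    # IS weights: target over the Gaussian proposal each draw came from
    # (shared normalizing constants cancel in the self-normalized weights).
    log_w = log_target(xs) + 0.5 * ((xs - mus) / sigma)**2
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    # Adaptation: resample locations in proportion to the weights.
    mus = xs[rng.choice(M, size=M, p=w)]

print("posterior mean estimate:", np.sum(w * xs))  # from the last iteration
```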

Keywords: Mathematical optimization; Computer science; Monte Carlo method; Inference; Hybrid Monte Carlo; Quasi-Monte Carlo method; Kinetic Monte Carlo; Rejection sampling; Sampling (statistics); Markov chain Monte Carlo; Dynamic Monte Carlo method; Monte Carlo integration; Monte Carlo method in statistical physics; Artificial intelligence; Particle filter; Monte Carlo molecular modeling

Anti-tempered Layered Adaptive Importance Sampling

2017

Monte Carlo (MC) methods are widely used for Bayesian inference in signal processing, machine learning and statistics. In this work, we introduce an adaptive importance sampler which mixes together the benefits of the Importance Sampling (IS) and Markov Chain Monte Carlo (MCMC) approaches. Different parallel MCMC chains provide the location parameters of the proposal probability density functions (pdfs) used in an IS method. The MCMC algorithms consider a tempered version of the posterior distribution as the invariant density. We also provide exhaustive theoretical support explaining why, in the presented technique, even an anti-tempering strategy (reducing the scaling of the posterior) can …
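A minimal sketch of the layered structure the abstract describes: an MCMC chain run on a tempered version of the target supplies the means of Gaussian proposals, and importance sampling against the untempered target is performed around those means. The tempering parameter and scales are mine, a single chain is used, and the paper's anti-tempering analysis is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)
log_post = lambda x: -0.5 * (x - 2.0)**2       # toy posterior: N(2, 1)
beta = 0.5                                      # tempered invariant density: post^beta

# Upper layer: random-walk MH chain on the tempered posterior.
means, x = [], 0.0
for _ in range(200):
    prop = x + rng.normal()
    if np.log(rng.uniform()) < beta * (log_post(prop) - log_post(x)):
        x = prop
    means.append(x)

# Lower layer: IS with Gaussian proposals centered at the chain states;
# the weights use the UNtempered posterior.
means = np.array(means)
sig = 1.0
xs = means + sig * rng.normal(size=means.size)
log_w = log_post(xs) + 0.5 * ((xs - means) / sig)**2
w = np.exp(log_w - log_w.max()); w /= w.sum()
print("posterior mean estimate:", np.sum(w * xs))   # should be near 2
```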

Keywords: Mathematical optimization; Rejection sampling; Slice sampling; Markov chain Monte Carlo; Hybrid Monte Carlo; Metropolis–Hastings algorithm; Parallel tempering; Particle filter; Importance sampling

Monte-Carlo Methods

2003

The article contains sections titled:

1 Introduction and Overview
2 Random-Number Generation
2.1 General Introduction
2.2 Properties That a Random-Number Generator (RNG) Should Have
2.3 Comments about a Few Frequently Used Generators
3 Simple Sampling of Probability Distributions Using Random Numbers
3.1 Numerical Estimation of Known Probability Distributions
3.2 "Importance Sampling" versus "Simple Sampling"
3.3 Monte-Carlo as a Method of Integration
3.4 Infinite Integration Space
3.5 Random Selection of Lattice Sites
3.6 The Self-Avoiding Walk Problem
3.7 Simple Sampling versus Biased Sampling: the Example of SAWs Continued
4 Survey of Applications to Simulation of Transport Processes
4.…
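A small numerical illustration of Section 3.2's theme, "importance sampling versus simple sampling", on the integral I = ∫₀^∞ x² e^(−x) dx = 2. The example and constants are mine, not the article's.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100_000

# Simple sampling: uniform draws on a truncated range [0, 20]; high
# variance, since most draws land where the integrand is tiny.
x = rng.uniform(0.0, 20.0, n)
simple = 20.0 * np.mean(x**2 * np.exp(-x))

# Importance sampling: draw from Exp(1), whose density e^(−x) matches
# the integrand's decay, leaving only x² as the weighted integrand.
y = rng.exponential(1.0, n)
importance = np.mean(y**2)

print("simple:", simple, " importance:", importance, " exact: 2")
```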

Keywords: Rejection sampling; Monte Carlo method; Slice sampling; Sampling (statistics); Monte Carlo method in statistical physics; Statistical physics; Statistical mechanics; Umbrella sampling; Importance sampling

Adaptive Metropolis algorithm using variational Bayesian adaptive Kalman filter

2013

Markov chain Monte Carlo (MCMC) methods are powerful computational tools for analysis of complex statistical problems. However, their computational efficiency is highly dependent on the chosen proposal distribution, which is generally difficult to find. One way to solve this problem is to use adaptive MCMC algorithms which automatically tune the statistics of a proposal distribution during the MCMC run. A new adaptive MCMC algorithm, called the variational Bayesian adaptive Metropolis (VBAM) algorithm, is developed. The VBAM algorithm updates the proposal covariance matrix using the variational Bayesian adaptive Kalman filter (VB-AKF). A strong law of large numbers for the VBAM algorithm is…
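A structural sketch of the adaptation loop the abstract describes: a Metropolis chain whose proposal covariance is re-estimated online from the chain states. The recursive update below is a plain forgetting-factor covariance tracker standing in for the paper's VB-AKF update, which is not reproduced here; target and constants are mine.

```python
import numpy as np

rng = np.random.default_rng(9)
target_cov = np.array([[1.0, 0.9], [0.9, 1.0]])
P = np.linalg.inv(target_cov)
logpi = lambda x: -0.5 * x @ P @ x              # toy 2-D Gaussian target

x = np.zeros(2)
mean, cov = np.zeros(2), np.eye(2)
lam, eps = 0.99, 1e-6                            # forgetting factor, jitter
for _ in range(20000):
    # Random-walk proposal with the standard 2.38^2/d scaling (d = 2).
    prop = x + rng.multivariate_normal(np.zeros(2),
                                       2.38**2 / 2 * cov + eps * np.eye(2))
    if np.log(rng.uniform()) < logpi(prop) - logpi(x):
        x = prop
    # Stand-in recursive update (exponential forgetting) of the proposal
    # mean and covariance from the chain history.
    mean = lam * mean + (1 - lam) * x
    d = (x - mean)[:, None]
    cov = lam * cov + (1 - lam) * (d @ d.T)

print("adapted covariance (roughly tracks the target):\n", np.round(cov, 2))
```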

Keywords: Statistics and Probability; Mathematical optimization; Covariance matrix; Bayesian probability; Rejection sampling; Statistics theory; Markov chain Monte Carlo; Kalman filter; Metropolis–Hastings algorithm; Convergence; Kernel adaptive filter
Published in: Computational Statistics & Data Analysis